Principal Component Analysis Based on Robust Estimators of the Covariance or Correlation Matrix: Influence Functions and Efficiencies
Authors
Abstract
A robust principal component analysis can be easily performed by computing the eigenvalues and eigenvectors of a robust estimator of the covariance or correlation matrix. In this paper we derive the influence functions and the corresponding asymptotic variances for these robust estimators of eigenvalues and eigenvectors. The behavior of several of these estimators is investigated by a simulation study. Finally, the use of empirical influence functions is illustrated by a real data example.
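As a minimal sketch of this idea (not code from the paper), the snippet below computes principal components from a robust scatter estimate; the Minimum Covariance Determinant (MCD) estimator from scikit-learn and the synthetic data are illustrative assumptions, since the paper covers a whole family of robust estimators.

```python
# Sketch: robust PCA = eigendecomposition of a robust covariance estimate.
# MCD is used here purely as an example of a robust scatter estimator.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:10] += 8.0                                  # a few gross outliers

mcd = MinCovDet(random_state=0).fit(X)         # robust location and scatter

# Robust principal components = eigenvectors/eigenvalues of the robust scatter.
eigenvalues, eigenvectors = np.linalg.eigh(mcd.covariance_)
order = np.argsort(eigenvalues)[::-1]          # sort by decreasing "variance"
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

scores = (X - mcd.location_) @ eigenvectors    # robust principal component scores
```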
Similar articles
The Influence Function of Stahel-Donoho Type Methods for Robust Covariance Estimation and PCA
Principal component analysis (PCA) is a popular technique to reduce the dimension of the data at hand. Since PCA is based on the empirical variance-covariance matrix, the estimates can be severely damaged by outliers. To reduce these effects, several robust methods were developed, mostly by replacing the classical variance-covariance matrix by a robust version. In this paper we focus on Stahel-...
Multichannel signal processing using spatial rank covariance matrices
This paper addresses the problem of estimating the covariance matrix reliably when the assumptions, such as Gaussianity, on the probabilistic nature of multichannel data do not necessarily hold. Multivariate spatial sign and rank functions, which are generalizations of univariate sign and centered rank, are introduced. Furthermore, spatial rank covariance matrix and spatial Kendall’s tau covari...
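A minimal sketch of one such estimator, the spatial sign covariance matrix, is given below; the Weiszfeld iteration used for the spatial median and the synthetic data are illustrative assumptions, not details taken from that paper.

```python
# Sketch: spatial sign covariance matrix = average outer product of the
# spatial signs (x_i - m) / ||x_i - m||, with m a robust multivariate center.
import numpy as np

def spatial_median(X, n_iter=100, eps=1e-8):
    """Weiszfeld iteration for the L1 (spatial) median, used as the center m."""
    m = np.median(X, axis=0)
    for _ in range(n_iter):
        d = np.linalg.norm(X - m, axis=1)
        d = np.where(d < eps, eps, d)          # avoid division by zero
        m_new = (X / d[:, None]).sum(axis=0) / (1.0 / d).sum()
        if np.linalg.norm(m_new - m) < eps:
            break
        m = m_new
    return m

def spatial_sign_covariance(X):
    """Average outer product of the spatial signs of the centered observations."""
    diffs = X - spatial_median(X)
    norms = np.linalg.norm(diffs, axis=1, keepdims=True)
    norms[norms == 0] = 1.0                    # zero vectors contribute zero signs
    U = diffs / norms
    return U.T @ U / len(X)

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
S = spatial_sign_covariance(X)
# The eigenvectors of S estimate the principal axes; its eigenvalues need a
# correction before they can be compared with covariance eigenvalues.
vals, vecs = np.linalg.eigh(S)
```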
EIGENVECTORS OF COVARIANCE MATRIX FOR OPTIMAL DESIGN OF STEEL FRAMES
In this paper, the discrete method of eigenvectors of covariance matrix has been used for weight minimization of steel frame structures. The Eigenvectors of Covariance Matrix (ECM) algorithm is a robust, iterative method for solving optimization problems and is inspired by the CMA-ES method. Both methods use a covariance matrix in the optimization process, but the covariance matrix calcula...
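Since the ECM algorithm itself is only summarized above, the sketch below shows only the generic idea of a covariance-matrix-driven iterative search in the spirit of CMA-ES (rank-based selection, covariance re-estimated from the best steps); the objective, update rule, and all parameters are illustrative assumptions, not the paper's method.

```python
# Sketch of a covariance-matrix-driven search loop (CMA-ES-like, simplified:
# rank-based update of the covariance only, fixed step size).
import numpy as np

def covariance_search(f, x0, sigma=0.5, pop=20, elite=5, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    mean, C = np.asarray(x0, float), np.eye(len(x0))
    for _ in range(iters):
        # Sample candidate designs from the current search distribution.
        candidates = rng.multivariate_normal(mean, sigma**2 * C, size=pop)
        best = candidates[np.argsort([f(c) for c in candidates])[:elite]]
        steps = (best - mean) / sigma
        # Re-estimate the covariance from the best steps; recenter the mean.
        C = steps.T @ steps / elite + 1e-8 * np.eye(len(x0))
        mean = best.mean(axis=0)
    return mean

# Toy usage: a quadratic objective stands in for the structural weight.
sphere = lambda x: float(np.sum(x**2))
x_opt = covariance_search(sphere, x0=[3.0, -2.0, 1.5])
```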
Compression of Breast Cancer Images By Principal Component Analysis
The principle of dimensionality reduction with PCA is the representation of the dataset X in terms of eigenvectors e_i ∈ R^N of its covariance matrix. The eigenvectors oriented in the directions with the maximum variance of X in R^N carry the most relevant information of X. These eigenvectors are called principal components [8]. Ass...
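A minimal sketch of this compression step is given below, with synthetic data standing in for the images and an arbitrary choice of k; both are assumptions made only for illustration.

```python
# Sketch: project the data onto the leading eigenvectors of its covariance
# matrix and reconstruct from k components (lossy compression).
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 64))                 # 100 samples, 64 "pixel" features
mean = X.mean(axis=0)
cov = np.cov(X - mean, rowvar=False)

vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
vecs = vecs[:, order]                          # principal components, by variance

k = 8                                          # keep the top-k components
scores = (X - mean) @ vecs[:, :k]              # compressed representation
X_approx = scores @ vecs[:, :k].T + mean       # reconstruction from k components
```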
Asymptotic distributions of principal components based on robust dispersions
Algebraically, principal components can be defined as the eigenvalues and eigenvectors of a covariance or correlation matrix, but they are statistically meaningful as successive projections of the multivariate data in the direction of maximal variability. An attractive alternative in robust principal component analysis is to replace the classical variability measure, i.e. variance, by a robust ...
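A minimal sketch of this projection-pursuit idea is shown below, assuming the MAD as the robust scale and the data points themselves as candidate directions; both choices are illustrative and not necessarily those used in that paper.

```python
# Sketch: the first robust principal direction maximizes a robust scale
# (here the MAD) of the projected, robustly centered data.
import numpy as np

def mad(z):
    """Median absolute deviation, a robust scale replacing the variance."""
    return np.median(np.abs(z - np.median(z)))

def first_robust_direction(X):
    center = np.median(X, axis=0)
    Xc = X - center
    norms = np.linalg.norm(Xc, axis=1)
    dirs = Xc[norms > 0] / norms[norms > 0][:, None]   # candidate directions
    scales = [mad(Xc @ d) for d in dirs]
    return dirs[int(np.argmax(scales))]

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 4)) @ np.diag([3.0, 2.0, 1.0, 0.5])
a1 = first_robust_direction(X)                 # estimated first robust axis
```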